Speeding Up Feature Subset Selection Through Mutual Information Relevance Filtering

Authors

  • Gert Van Dijck
  • Marc M. Van Hulle
Abstract

A relevance filter is proposed that removes features based on the mutual information between the class labels and the features. It is proven that both feature independence and class-conditional feature independence are required for the filter to be statistically optimal; this is shown by establishing a relationship with the conditional relative entropy framework for feature selection. Removing features at various significance levels as a preprocessing step to sequential forward search leads to a large increase in speed without a decrease in classification accuracy. These results are demonstrated in experiments on five high-dimensional, publicly available gene expression data sets.
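As a rough illustration of the kind of pipeline the abstract describes (not the authors' implementation), the sketch below filters features by a permutation-based significance test on their mutual information with the class labels and then runs a sequential forward search on the surviving features. The estimator (scikit-learn's mutual_info_classif), the wrapped classifier, the 5% level, and the permutation test itself are assumptions made for this example.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier

def relevance_filter(X, y, alpha=0.05, n_permutations=200, random_state=0):
    """Keep features whose MI with the class labels is significant at level alpha.

    Significance is assessed against a null distribution obtained by permuting y
    (an assumption here; the paper's exact test may differ).
    """
    rng = np.random.default_rng(random_state)
    observed = mutual_info_classif(X, y, random_state=random_state)
    # Null distribution: MI of each feature with a permuted label vector.
    null = np.empty((n_permutations, X.shape[1]))
    for b in range(n_permutations):
        null[b] = mutual_info_classif(X, rng.permutation(y), random_state=random_state)
    # One-sided p-value per feature.
    p_values = (1 + (null >= observed).sum(axis=0)) / (1 + n_permutations)
    return np.flatnonzero(p_values < alpha)

def filter_then_forward_search(X, y, n_features=10):
    """Relevance filtering followed by a wrapper-style sequential forward search."""
    kept = relevance_filter(X, y)
    sfs = SequentialFeatureSelector(KNeighborsClassifier(),
                                    n_features_to_select=n_features,
                                    direction="forward", cv=5)
    sfs.fit(X[:, kept], y)
    # Map the selected columns back to the original feature indices.
    return kept[sfs.get_support()]
```

The forward search then only explores the (typically much smaller) filtered feature set, which is where the speed-up in the abstract comes from.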

Related articles

Speeding Up the Wrapper Feature Subset Selection in Regression by Mutual Information Relevance and Redundancy Analysis

A hybrid filter/wrapper feature subset selection algorithm for regression is proposed. First, features are filtered by a relevance and redundancy filter that uses the mutual information between the regression variables and the target variable; permutation tests are introduced to identify statistically significant relevant and redundant features. Second, a wrapper searches for good candidate feature subsets by taking...
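A minimal sketch of the relevance-and-redundancy idea in the regression setting, assuming mutual_info_regression as the estimator and a simple pairwise permutation test; the thresholds and the greedy redundancy pass are illustrative choices, not the paper's procedure.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def permutation_p_value(x, y, n_permutations=200, random_state=0):
    """One-sided p-value for I(x; y) > 0, obtained by permuting y (illustrative test)."""
    rng = np.random.default_rng(random_state)
    observed = mutual_info_regression(x.reshape(-1, 1), y)[0]
    null = [mutual_info_regression(x.reshape(-1, 1), rng.permutation(y))[0]
            for _ in range(n_permutations)]
    return (1 + np.sum(np.asarray(null) >= observed)) / (1 + n_permutations)

def relevance_redundancy_filter(X, y, alpha=0.05):
    # Relevance: keep features significantly informative about the target.
    relevant = [j for j in range(X.shape[1])
                if permutation_p_value(X[:, j], y) < alpha]
    # Redundancy: greedily drop a relevant feature if it shares significant
    # mutual information with a feature that was already kept.
    kept = []
    for j in relevant:
        if all(permutation_p_value(X[:, j], X[:, k]) >= alpha for k in kept):
            kept.append(j)
    return kept
```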

Feature Weighting and Instance Selection for Collaborative Filtering

Collaborative filtering uses a database of consumers' preferences to make personalized product recommendations and is achieving widespread success in e-commerce. In this paper, we present several feature-weighting methods to improve the accuracy of collaborative filtering algorithms. Furthermore, we propose to reduce the training data set by selecting only highly relevant instances. We ...
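For a concrete picture of what feature weighting can mean in a neighbourhood-based collaborative filter, here is a small sketch of a weighted cosine similarity and a nearest-neighbour rating prediction; the weight vector, the rating-matrix layout, and the use of 0 for "not rated" are assumptions for illustration, not the methods evaluated in the paper.

```python
import numpy as np

def weighted_cosine(u, v, w):
    """Cosine similarity between two rating vectors after per-item weighting."""
    wu, wv = w * u, w * v
    denom = np.linalg.norm(wu) * np.linalg.norm(wv)
    return 0.0 if denom == 0.0 else float(np.dot(wu, wv) / denom)

def predict_rating(ratings, weights, target_user, item, k=5):
    """Predict a missing rating as the similarity-weighted mean of the k most
    similar users who rated the item (0 denotes 'not rated' in this toy layout)."""
    raters = [u for u in range(ratings.shape[0])
              if u != target_user and ratings[u, item] > 0]
    sims = sorted(((weighted_cosine(ratings[target_user], ratings[u], weights), u)
                   for u in raters), reverse=True)[:k]
    total = sum(s for s, _ in sims)
    return 0.0 if total == 0 else sum(s * ratings[u, item] for s, u in sims) / total
```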

Speeding up Feature Selection by Using an Information Theoretic Bound

The paper proposes a technique for speeding up the search for the optimal set of features in classification problems where the input variables are discrete or nominal. The approach is based on the definition of an upper bound on the mutual information between the target and a set of d input variables. This bound is derived as a function of the mutual information of its subsets of cardinality d − 1...
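The pruning idea behind such a bound can be illustrated generically: if an upper bound on the mutual information of a candidate subset is already below the best score found so far, the subset can be skipped without evaluating it exactly. The sketch below assumes user-supplied callables for the exact mutual information and for the bound; the bound function is a placeholder, not the bound derived in the paper.

```python
from itertools import combinations
from typing import Callable, Sequence, Tuple

def best_subset_with_pruning(
    features: Sequence[int],
    d: int,
    mi: Callable[[Tuple[int, ...]], float],               # exact I(target; subset), expensive
    mi_upper_bound: Callable[[Tuple[int, ...]], float],   # cheap upper bound (placeholder)
):
    """Search over subsets of size d, skipping subsets whose upper-bounded
    mutual information cannot beat the best score found so far."""
    best_score, best_subset = float("-inf"), None
    for subset in combinations(features, d):
        if mi_upper_bound(subset) <= best_score:
            continue  # the bound guarantees this subset cannot win
        score = mi(subset)
        if score > best_score:
            best_score, best_subset = score, subset
    return best_subset, best_score
```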

MIFS-ND: A mutual information-based feature selection method

Feature selection is used to choose a subset of relevant features for effective classification of data. In high-dimensional data classification, the performance of a classifier often depends on the feature subset used for classification. In this paper, we introduce a greedy feature selection method using mutual information. This method combines both feature–feature mutual information and feature–class mutual information...
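The two ingredients named in the teaser can be combined in a simple greedy trade-off, sketched below in the style of Battiti's classic MIFS criterion (relevance minus a weighted redundancy term); MIFS-ND ranks these quantities differently, so this is only an illustration of the ingredients, with the estimators and the beta weight chosen for the example.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

def greedy_mi_selection(X, y, n_select, beta=0.5):
    """Greedy selection trading off relevance I(f; C) against redundancy I(f; s).

    Assumes discrete (or discretised) features so that mutual_info_score applies.
    """
    relevance = mutual_info_classif(X, y, discrete_features=True)
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < n_select:
        def score(j):
            redundancy = sum(mutual_info_score(X[:, j], X[:, s]) for s in selected)
            return relevance[j] - beta * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```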

Mental Arithmetic Task Recognition Using Effective Connectivity and Hierarchical Feature Selection From EEG Signals

Introduction: Mental arithmetic analysis based on the electroencephalogram (EEG) signal, for monitoring the state of the user's brain functioning, can be helpful for understanding psychological disorders such as attention deficit hyperactivity disorder, autism spectrum disorder, or dyscalculia, in which difficulty in learning or understanding arithmetic exists. Most mental arithmetic recognition...

Journal:

Volume   Issue

Pages  -

Publication date: 2007